# Efficient Language Models
## BitNet b1.58 XL (Q8_0 GGUF)

License: MIT

BitNet b1.58 is a large language model whose weights are quantized to 1.58 bits (ternary values). Lowering the weight precision sharply reduces compute and memory requirements while keeping performance close to that of a full-precision model.

Tags: Large Language Model, Transformers

By BoscoTheDog · 326 downloads · 7 likes
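As a rough illustration of the 1.58-bit idea, the following is a minimal NumPy sketch of absmean ternary quantization in the style described for BitNet b1.58 (scale weights by their mean absolute value, then round and clip to {-1, 0, +1}). The function name and epsilon choice are illustrative, not taken from the model's actual implementation.

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray, eps: float = 1e-8):
    """Quantize a weight tensor to ternary {-1, 0, +1} values.

    Uses an absmean scale: divide by the mean absolute weight,
    then round and clip to [-1, 1] (BitNet b1.58-style sketch).
    """
    gamma = np.abs(w).mean()                       # per-tensor absmean scale
    w_q = np.clip(np.round(w / (gamma + eps)), -1, 1)
    return w_q.astype(np.int8), gamma

# Example: dequantizing recovers a coarse approximation of the originals
w = np.random.randn(4, 4).astype(np.float32)
w_q, gamma = absmean_ternary_quantize(w)
w_hat = w_q * gamma                                # dequantized weights
```

Each quantized weight takes one of three values, which is where the nominal log2(3) ≈ 1.58 bits per weight comes from.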
## OpenELM-3B-Instruct

OpenELM is a family of open-source, efficient language models. It adopts a layer-wise (hierarchical) parameter-allocation strategy to improve model accuracy, and ships pre-trained and instruction-tuned variants ranging from 270 million to 3 billion parameters.

Tags: Large Language Model, Transformers

By apple · 8,716 downloads · 333 likes
## OpenELM-1.1B-Instruct

OpenELM is a family of open-source, efficient language models that use a hierarchical (layer-wise) scaling strategy to allocate parameters non-uniformly across the layers of the Transformer, thereby improving model accuracy.

Tags: Large Language Model, Transformers

By apple · 1.5M downloads · 62 likes
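The layer-wise scaling idea above can be sketched as follows: rather than giving every Transformer layer an identical shape, the number of attention heads and the FFN width are interpolated linearly from the first layer to the last. This is a minimal sketch of the scheme; the endpoint values, head dimension, and function name here are illustrative assumptions, not OpenELM's actual configuration.

```python
def layer_wise_scaling(num_layers: int, d_model: int, head_dim: int = 64,
                       alpha=(0.5, 1.0), beta=(0.5, 4.0)):
    """Sketch of layer-wise scaling for a Transformer.

    alpha interpolates the attention-width fraction across depth;
    beta interpolates the FFN width multiplier. Early layers end up
    narrower, later layers wider (illustrative endpoints).
    """
    configs = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)             # depth fraction in [0, 1]
        a = alpha[0] + t * (alpha[1] - alpha[0])   # attention scale at layer i
        b = beta[0] + t * (beta[1] - beta[0])      # FFN multiplier at layer i
        n_heads = max(1, int(round(a * d_model / head_dim)))
        ffn_dim = int(round(b * d_model))
        configs.append({"layer": i, "n_heads": n_heads, "ffn_dim": ffn_dim})
    return configs

cfg = layer_wise_scaling(num_layers=4, d_model=512)
```

With these endpoints the first layer gets 4 heads and a 256-wide FFN while the last gets 8 heads and a 2048-wide FFN, spending parameters where they help most instead of uniformly.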